Combining Classifiers with Meta Decision Trees
Abstract
Similar articles
A comparison of stacking with meta decision trees to other combining methods
Meta decision trees (MDTs) are a method for combining multiple classifiers. We present an integration of the algorithm MLC4.5 for learning MDTs into the Weka data mining suite. We compare classifier ensembles combined with MDTs to bagged and boosted decision trees, and to classifier ensembles combined with other methods: voting, grading, multi-scheme and stacking with multi-response linear regr...
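A minimal sketch (not from the paper): two of the combining methods the comparison refers to, voting and stacking, expressed with scikit-learn's VotingClassifier and StackingClassifier. MDTs, grading and multi-scheme have no off-the-shelf scikit-learn equivalent, so only the two generic combiners are shown; the data set and base classifiers are illustrative choices.

# Voting vs. stacking over the same base classifiers (illustrative only).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import StackingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
base = [("nb", GaussianNB()),
        ("knn", KNeighborsClassifier()),
        ("tree", DecisionTreeClassifier(random_state=0))]

# Voting combines the base classifiers' class probabilities directly.
voting = VotingClassifier(estimators=base, voting="soft")
# Stacking trains a meta-level learner on the base classifiers' outputs.
stacking = StackingClassifier(estimators=base,
                              final_estimator=LogisticRegression(max_iter=1000))

for name, ensemble in [("voting", voting), ("stacking", stacking)]:
    print(name, cross_val_score(ensemble, X, y, cv=5).mean())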
A comparison of stacking with MDTs to bagging, boosting, and other stacking methods
In this paper, we present an integration of the algorithm MLC4.5 for learning meta decision trees (MDTs) into the Weka data mining suite. MDTs are a method for combining multiple classifiers. Instead of giving a prediction, MDT leaves specify which classifier should be used to obtain a prediction. The algorithm is based on the C4.5 algorithm for learning ordinary decision trees. An extensive pe...
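A hypothetical sketch of the MDT idea described above, not the MLC4.5 implementation: an ordinary decision tree is learned at the meta level to predict which base classifier to use, and its leaf decision selects the classifier whose prediction becomes the final label. The meta-level attributes and the data set are illustrative stand-ins, not those of the paper.

# Selecting a base classifier per instance with a meta-level decision tree.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base = [GaussianNB(), KNeighborsClassifier(), DecisionTreeClassifier(random_state=0)]
for clf in base:
    clf.fit(X_tr, y_tr)

# Meta-level attributes: the maximum class probability of each base classifier,
# a stand-in for the certainty-style properties MDTs branch on.
def meta_features(X):
    return np.column_stack([clf.predict_proba(X).max(axis=1) for clf in base])

# Meta-level target: the index of a base classifier that labels the instance
# correctly (falls back to index 0 if none does -- good enough for a sketch).
train_preds = np.column_stack([clf.predict(X_tr) for clf in base])
best = (train_preds == y_tr[:, None]).argmax(axis=1)

meta_tree = DecisionTreeClassifier(max_depth=3, random_state=0)
meta_tree.fit(meta_features(X_tr), best)

# At prediction time each leaf names a base classifier; that classifier predicts.
chosen = meta_tree.predict(meta_features(X_te))
y_hat = np.array([base[c].predict(X_te[i:i + 1])[0] for i, c in enumerate(chosen)])
print("accuracy:", (y_hat == y_te).mean())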
An Experimental Study of Methods Combining Multiple Classifiers - Diversified both by Feature Selection and Bootstrap Sampling
Ensemble approaches are learning algorithms that construct a set of classifiers and then classify new instances by combining their predictions. These approaches can outperform single classifiers on a wide range of classification problems. In this paper we propose an extension of the bagging classifier that integrates it with feature subset selection. Moreover, we examine the use of other methods ...
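A hedged sketch of the idea, not the paper's exact algorithm: bagging diversified both by bootstrap sampling of instances and by a random feature subset per ensemble member, which scikit-learn's BaggingClassifier exposes through bootstrap=True and max_features < 1.0. The data set and ensemble sizes are illustrative.

# Bagging with per-member bootstrap samples and random feature subsets.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

ensemble = BaggingClassifier(
    estimator=DecisionTreeClassifier(random_state=0),  # base_estimator in scikit-learn < 1.2
    n_estimators=50,
    bootstrap=True,    # bootstrap sample of the instances for each member
    max_features=0.5,  # random 50% feature subset for each member
    random_state=0,
)
print("5-fold accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())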
A Comparison of Stacking with Meta Decision Trees to Bagging, Boosting, and Stacking with other Methods
Meta decision trees (MDTs) are a method for combining multiple classifiers. We present an integration of the algorithm MLC4.5 for learning MDTs into the Weka data mining suite. We compare classifier ensembles combined with MDTs to bagged and boosted decision trees, and to classifier ensembles combined with other methods: voting and stacking with three different meta-level classifiers (ordinary ...
Minimal Cost Complexity Pruning of Meta-Classifiers
Integrating multiple learned classification models (classifiers) computed over large and (physically) distributed data sets has been demonstrated as an effective approach to scaling inductive learning techniques, while also boosting the accuracy of individual classifiers. These gains, however, come at the expense of an increased demand for run-time system resources. The final ensemble meta-clas...
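A hedged illustration of the pruning technique named in the title: minimal cost-complexity pruning as exposed by scikit-learn's DecisionTreeClassifier (ccp_alpha). It prunes a single tree rather than a distributed meta-classifier ensemble, so it only shows the size/accuracy trade-off that motivates reducing a meta-classifier's run-time footprint.

# Minimal cost-complexity pruning: tree size vs. test accuracy along the alpha path.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The pruning path yields the sequence of effective alphas for the fully grown tree.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)

for alpha in path.ccp_alphas[::5]:
    pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)
    print(f"alpha={alpha:.4f}  leaves={pruned.get_n_leaves()}  "
          f"test accuracy={pruned.score(X_te, y_te):.3f}")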
Journal title:
Volume, Issue:
Pages: -
Publication date: 2003